Terraform changes to use existing VPC #22

Draft · wants to merge 8 commits into base: main
Conversation

smohiudd (Contributor):
These Terraform changes allow using an existing VPC instead of creating a new one.

@smohiudd smohiudd requested a review from ranchodeluxe May 24, 2023 15:55
```hcl
  name = var.ecr_repository_name
}
# data "aws_ecr_repository" "service" {
#   count = var.use_ecr ? 1 : 0
```

Contributor:

This is a good change IMHO 🥳 But let's also remove `use_ecr` in `variables.tf`, then delete this block and keep your other change in this module. Nice work.

smohiudd (Contributor, Author) · May 24, 2023:

We still need `use_ecr` in other parts of `main.tf`. For example: …

So should we still keep it?

ranchodeluxe (Contributor) · May 24, 2023:

Doh, I didn't check that. I don't think we need it. What we're basically doing here is saying, "let's just create the role with all the permissions we think it needs for ECR too," instead of having the user tell us at function-call time whether they need it or not. I think that's fine.

But are you comfortable changing all those things? If not, just abort all the changes and I can handle it in the future.
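The trade-off being discussed can be sketched in HCL. The conditional form gates the lookup on a flag the caller must set; the unconditional form drops the flag and always grants ECR access. Resource and variable names below are illustrative, not taken verbatim from this repository's modules:

```hcl
# Conditional form: the caller opts in to ECR via var.use_ecr.
variable "use_ecr" {
  type    = bool
  default = false
}

data "aws_ecr_repository" "service" {
  count = var.use_ecr ? 1 : 0
  name  = var.ecr_repository_name
}

# Unconditional form: always look up the repository and grant the
# role ECR permissions, so no flag is needed at call time.
data "aws_ecr_repository" "service_always" {
  name = var.ecr_repository_name
}
```

The unconditional form is simpler but means every deployment's role carries ECR permissions whether or not it pulls from ECR.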

```diff
@@ -0,0 +1,13 @@
# region = "us-west-2"
```

Contributor:

You shouldn't have to worry about commenting var files out. They won't be used, because you should be specifically passing them in at plan/apply time via the `-var-file=./vars/<name>.tf` argument. Maybe take a look at this document and the last command:

https://github.com/NASA-IMPACT/veda-features-api/blob/main/docs/IACHOWTO.md
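The var-file workflow being described would look roughly like this; the filename `./vars/dev.tf` is illustrative:

```shell
terraform init
terraform plan  -var-file=./vars/dev.tf
terraform apply -var-file=./vars/dev.tf
```

Because the var file is passed explicitly, any variable assignments in files not named on the command line (and not auto-loaded `*.auto.tfvars`) simply never take effect, so there is no need to comment them out.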

```hcl
availability_zones = ["us-west-2a", "us-west-2b"]
service_port       = 8080
dns_zone_name      = "delta-backend.com"
dns_subdomain      = "firenrt"
```

Contributor:

As we talked about, you'll probably want to change the `dns_subdomain` to `ghg` or something, and check which `dns_zone_name` is available in MCP for ghg.
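The suggested change would amount to something like the following in the var file; the zone name is a placeholder until the MCP-available zone is confirmed:

```hcl
dns_zone_name = "ghgc-backend.com" # placeholder: confirm which zone exists in MCP
dns_subdomain = "ghg"
```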

```hcl
  }
}
backend "s3" {
  bucket = "veda-wfs3-tf-state-bucket"
```

Contributor:

You'll need to manually create a new S3 bucket in this region and change this name, so we don't clobber this state bucket.
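Concretely, after creating a fresh bucket (for example with `aws s3 mb s3://<new-bucket-name> --region <region>`), the backend block would point at it. The bucket name and key below are placeholders:

```hcl
terraform {
  backend "s3" {
    bucket = "ghg-features-api-tf-state-bucket" # placeholder: S3 bucket names must be globally unique
    key    = "terraform.tfstate"
    region = "us-west-2"
  }
}
```

Note that backend configuration can't use variables, so the new name has to be edited in directly before running `terraform init`.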

ranchodeluxe (Contributor) left a comment:

Nice work duder 🎉

I left specific comments below. Here are some general comments.

  1. I'd rename the features-api folder to something more specific, such as ghg-features-api-shared-vpc, since your variables will probably change and require us to pass in VPC IDs, subnet IDs, security group IDs, etc.

  2. I'd advise reading over https://github.com/NASA-IMPACT/veda-features-api/blob/main/docs/IACHOWTO.md and especially using tfenv and workspaces in your flows. You'll need to change the bucket name in init.tf before you do anything else in that workflow, or it won't work.
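Point 1 hints at the new inputs a shared-VPC module would need. A sketch of what those variables might look like (names and descriptions are illustrative, not this repository's actual interface):

```hcl
variable "vpc_id" {
  description = "ID of the existing VPC to deploy into"
  type        = string
}

variable "subnet_ids" {
  description = "Subnets within the existing VPC for the service"
  type        = list(string)
}

variable "security_group_ids" {
  description = "Existing security groups to attach to the service"
  type        = list(string)
  default     = []
}
```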

2 participants